On Asymptotics of Eigenvectors of Large Sample Covariance Matrix
Authors
Abstract
Let {X_{ij}}, i, j = 1, 2, . . . , be a double array of i.i.d. complex random variables with EX_{11} = 0, E|X_{11}|^2 = 1 and E|X_{11}|^4 < ∞, and let A_n = (1/N) T_n^{1/2} X_n X_n^* T_n^{1/2}, where T_n^{1/2} is the square root of a nonnegative definite matrix T_n and X_n is the n × N matrix formed by the upper-left corner of the double array. The matrix A_n can be viewed as a sample covariance matrix of an i.i.d. sample from a population with mean zero and covariance matrix T_n, or as a multivariate F matrix if T_n is the inverse of another sample covariance matrix. To investigate the limiting behavior of the eigenvectors of A_n, a new form of empirical spectral distribution is defined, with weights determined by the eigenvectors, and it is shown to have the same limiting spectral distribution as the empirical spectral distribution defined by equal weights. Moreover, if {X_{ij}} and T_n are either real or complex and some additional moment assumptions hold, then linear spectral statistics defined by the eigenvectors of A_n are proved to have Gaussian limits. This suggests that the eigenvector matrix of A_n is nearly Haar distributed when T_n is a multiple of the identity matrix, an easy consequence for a Wishart matrix.
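The construction of A_n and the two empirical spectral distributions contrasted above can be sketched numerically. In this minimal sketch, the dimensions, the choice T_n = I, and the fixed unit vector `x_unit` used for the eigenvector weights are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 50, 100              # dimension and sample size (illustrative values)
Tn = np.eye(n)              # population covariance; identity for simplicity

# X_n: n x N matrix of i.i.d. entries with mean 0, unit variance
Xn = rng.standard_normal((n, N))

# A_n = (1/N) T_n^{1/2} X_n X_n^* T_n^{1/2}; with Tn = I the square
# root is trivial (use scipy.linalg.sqrtm for a general Tn)
T_half = np.linalg.cholesky(Tn)
An = (T_half @ Xn @ Xn.conj().T @ T_half.conj().T) / N

eigvals, eigvecs = np.linalg.eigh(An)

# Eigenvector weights |u_i^* x|^2 for a fixed unit vector x; they sum
# to 1 because the eigenvector matrix is unitary
x_unit = np.ones(n) / np.sqrt(n)
weights = np.abs(eigvecs.conj().T @ x_unit) ** 2

def esd(x):
    """Classical ESD: each eigenvalue carries equal weight 1/n."""
    return np.mean(eigvals <= x)

def vesd(x):
    """Eigenvector-weighted ESD: eigenvalue i carries weight |u_i^* x|^2."""
    return np.sum(weights[eigvals <= x])
```

The abstract's result is that, as n and N grow proportionally, these two distribution functions share the same limit, which is what makes the eigenvector weights a useful probe of how close the eigenvector matrix is to Haar distributed.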
Similar Papers
Asymptotics of Sample Eigenstructure for a Large Dimensional Spiked Covariance Model
This paper deals with a multivariate Gaussian observation model where the eigenvalues of the covariance matrix are all one, except for a finite number which are larger. Of interest is the asymptotic behavior of the eigenvalues of the sample covariance matrix when the sample size and the dimension of the observations both grow to infinity so that their ratio converges to a positive constant. Whe...
Eigenvectors of Covariance Matrix for Optimal Design of Steel Frames
In this paper, the discrete method of eigenvectors of covariance matrix has been used for weight minimization of steel frame structures. The Eigenvectors of Covariance Matrix (ECM) algorithm is a robust, iterative method for solving optimization problems and is inspired by the CMA-ES method. Both methods use a covariance matrix in the optimization process, but the covariance matrix calcula...
Asymptotics of the leading sample eigenvalues for a spiked covariance model
We consider a multivariate Gaussian observation model where the covariance matrix is diagonal and the diagonal entries are all equal to one except for a finite number which are bigger. We address the question of asymptotic behaviour of the eigenvalues of the sample covariance matrix when the sample size and the dimension of the observations both grow to infinity in such a way that their ratio c...
Optimizing the Input Covariance for MIMO Channels
A multivariate Gaussian input density achieves capacity for multiple-input multiple-output flat fading channels with additive white Gaussian noise and perfect receiver channel state information. Capacity computation for these channels reduces to the problem of finding the best input covariance matrix, and is in general a convex semi-definite program. This paper presents Kuhn-Tucker optimality c...
Optimal estimation of a large-dimensional covariance matrix under Stein's loss
This paper revisits the methodology of Stein (1975, 1986) for estimating a covariance matrix in the setting where the number of variables can be of the same magnitude as the sample size. Stein proposed to keep the eigenvectors of the sample covariance matrix but to shrink the eigenvalues. By minimizing an unbiased estimator of risk, Stein derived an ‘optimal’ shrinkage transformation. Unfortuna...